Class degree and relative maximal entropy

Authors

Abstract


Similar articles

Measures of Maximal Relative Entropy

Given an irreducible subshift of finite type X, a subshift Y, a factor map π : X → Y, and an ergodic invariant measure ν on Y, there can exist more than one ergodic measure on X that projects to ν and has maximal entropy among all measures in the fiber; however, there is an explicit bound on the number of such maximal-entropy preimages.
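In the notation of this abstract, the objects in question can be written as follows (the symbols h_μ and π_*μ are standard choices, not taken from the paper itself):

\[
\mathcal{M}_\nu(X) \;=\; \{\mu \;:\; \mu \text{ ergodic, invariant on } X,\ \pi_*\mu = \nu\},
\qquad
h_{\max}(\nu) \;=\; \sup_{\mu \in \mathcal{M}_\nu(X)} h_\mu ,
\]

and the result states that the number of ergodic measures μ attaining h_{\max}(ν) is finite, with an explicit bound.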


Measures of maximal entropy

We extend the results of Walters on the uniqueness of invariant measures with maximal entropy on compact groups to an arbitrary locally compact group. We show that the maximal entropy is attained at the left Haar measure and the measure of maximal entropy is unique.


Maximal entropy random networks with given degree distribution

Using a maximum entropy principle to assign a statistical weight to any graph, we introduce a model of random graphs with arbitrary degree distribution in the framework of standard statistical mechanics. We compute the free energy and the distribution of connected components. We determine the size of the percolation cluster above the percolation threshold. The conditional degree distribution on...
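A common concrete way to sample graphs with a prescribed degree sequence, closely related to the maximum-entropy ensemble described above, is stub matching (the configuration model). The sketch below is illustrative, not the paper's construction; the function name and degree sequence are our own choices.

```python
import random
from collections import Counter

def configuration_model(degrees, seed=0):
    """Sample a random multigraph with the given degree sequence
    by uniformly pairing 'stubs' (half-edges)."""
    if sum(degrees) % 2 != 0:
        raise ValueError("degree sequence must have even sum")
    rng = random.Random(seed)
    # One stub per unit of degree: node i appears degrees[i] times.
    stubs = [i for i, d in enumerate(degrees) for _ in range(d)]
    rng.shuffle(stubs)
    # Pair consecutive stubs into edges; this may create
    # self-loops and multi-edges, which the ensemble allows.
    return [(stubs[k], stubs[k + 1]) for k in range(0, len(stubs), 2)]

degrees = [3, 2, 2, 2, 1]          # prescribed degree sequence (even sum)
edges = configuration_model(degrees)
realized = Counter()
for u, v in edges:
    realized[u] += 1
    realized[v] += 1
print([realized[i] for i in range(len(degrees))])  # matches `degrees`
```

Every pairing of stubs reproduces the prescribed degrees exactly, so the randomness lies entirely in which pairs become edges, which is the sense in which the ensemble is maximally random given the degree constraint.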


Rate Distortion Function for a Class of Relative Entropy Sources

This paper deals with rate distortion or source coding with fidelity criterion, in measure spaces, for a class of source distributions. The class of source distributions is described by a relative entropy constraint set between the true and a nominal distribution. The rate distortion problem for the class is thus formulated and solved using minimax strategies, which result in robust source codi...
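For orientation, the classical single-source rate distortion function, which the abstract's minimax formulation generalizes, is

\[
R(D) \;=\; \inf_{p(\hat x \mid x)\,:\ \mathbb{E}\,d(X,\hat X) \le D} I(X;\hat X),
\]

and the class of sources described in the abstract can be written, in notation of our own choosing, as a relative entropy ball \(\{q : D(q \,\|\, p_0) \le c\}\) around a nominal distribution \(p_0\); the robust rate distortion problem is then a minimax over this set.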


Relative Entropy and Statistics

My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already h...



Journal

Journal title: Transactions of the American Mathematical Society

Year: 2012

ISSN: 0002-9947,1088-6850

DOI: 10.1090/s0002-9947-2012-05637-6